MiniMax M2.1 Makes a Stunning Open-Source Debut: A 10-Billion-Parameter Sparse-Architecture Model Sets a New SOTA, Outperforming Gemini 3 Pro and Claude 4.5 in Multilingual Programming
Domestic large-model maker MiniMax has open-sourced M2.1, which achieves breakthroughs in multilingual programming, code generation, and tool calling with its 10-billion-parameter sparse architecture. In authoritative benchmark tests it surpasses closed-source flagship models from Google and Anthropic, marking a new stage in the performance of open-source coding models.